The Diameter of Nearest Neighbor Graphs
Abstract
Any connected plane nearest neighbor graph on n points has diameter Ω(n^{1/6}); the bound generalizes, with an exponent depending on d, to any dimension d.

For any set of n points in the plane, we define the nearest neighbor graph by selecting a unique nearest neighbor for each point and adding an edge between each point and its neighbor. This is a directed graph with outdegree one; thus it is a pseudo-forest. Each component of the pseudo-forest is a tree, with a length-two directed cycle at the root. As with minimum spanning trees, the maximum degree in a nearest neighbor graph is five.

Monma and Suri [1] showed that, conversely, any tree with vertex degree at most five is the minimum spanning tree of some point set; thus minimum spanning tree topologies are exactly characterized by their degrees. Paterson and Yao [2] considered the corresponding question for nearest neighbor graphs. They showed that a nearest neighbor tree of depth D can have at most O(D^9) vertices. Thus, unlike minimum spanning trees, nearest neighbor graphs cannot be too bushy: a tree with many vertices must contain a long path. Paterson and Yao also constructed an example of a nearest neighbor graph with depth D and Ω(D^5) vertices. There remains a large gap between D^5 and D^9, and we are left with the question of the exact relation between the depth and the size of nearest neighbor graphs.

In this paper we tighten this gap by demonstrating that a nearest neighbor graph with diameter D can have at most O(D^6) points. Equivalently, a nearest neighbor graph with n points must have diameter Ω(n^{1/6}). It is possible that the insight into this problem provided by our proof can remove the remaining gap, by showing what a point set must look like if its nearest neighbor graph is to have diameter O(n^{1/6}), or alternatively by leading the way to a proof that the diameter must be Ω(n^{1/5}).

Our proof that there can be O(D^6) points in a nearest neighbor graph with diameter D follows the same general outline as Paterson and Yao's proof of their O(D^9) bound, so we summarize that outline here. Paterson and Yao partition the plane into an infinite sequence of similar annuli, centered on the origin, which is chosen to lie at the root of the nearest neighbor tree. The outer radius of each annulus is some suitably large constant C times the inner radius. Each point is assigned to the annulus containing the outermost point on the path from the point to the origin in the nearest neighbor graph. (Most points are assigned to the annulus containing them, but a point may be assigned to a larger annulus if its nearest neighbor path goes away from the origin before returning.) They then count the number of points that can be assigned to any one annulus, and the number of possible annuli.
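To make the definition above concrete, the following is a minimal sketch (not taken from the paper) of how such a graph can be built: each point is given a unique nearest neighbor, with ties broken by index, so the result has outdegree one and each component is a tree with a length-two cycle at its root. The brute-force search, the tie-breaking rule, and all function names here are illustrative assumptions.

```python
# Sketch: build the directed nearest neighbor graph of a planar point set.
# Each point is assigned a unique nearest neighbor (ties broken by index),
# so the result has outdegree one and is a pseudo-forest as described above.
from math import dist  # Python 3.8+

def nearest_neighbor_graph(points):
    """Return a list nn where nn[i] is the chosen nearest neighbor of points[i]."""
    nn = []
    for i, p in enumerate(points):
        # Brute-force O(n^2) search; a k-d tree would be used in practice.
        best = min((j for j in range(len(points)) if j != i),
                   key=lambda j: (dist(p, points[j]), j))
        nn.append(best)
    return nn

def roots(nn):
    """Mutual nearest neighbor pairs: the length-two cycles at the component roots."""
    return {tuple(sorted((i, j))) for i, j in enumerate(nn) if nn[j] == i}

if __name__ == "__main__":
    pts = [(0.0, 0.0), (1.0, 0.1), (2.5, 0.0), (2.6, 1.0), (5.0, 5.0)]
    nn = nearest_neighbor_graph(pts)
    print(nn)          # outdegree-one adjacency list, here [1, 0, 3, 2, 3]
    print(roots(nn))   # mutual pairs forming the two-cycles, here {(0, 1), (2, 3)}
```

On the small example at the bottom, the two mutual pairs are the roots of the two tree components, matching the pseudo-forest structure described above.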
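The equivalence claimed between the two forms of the result is a one-line calculation; c below stands for whatever constant is hidden in the O(D^6) bound.

```latex
% If every nearest neighbor graph of diameter D has at most c D^6 points,
% then any such graph on n points satisfies
\[
    n \;\le\; c\,D^{6}
    \qquad\Longrightarrow\qquad
    D \;\ge\; \Bigl(\tfrac{n}{c}\Bigr)^{1/6} \;=\; \Omega\bigl(n^{1/6}\bigr).
\]
% The same calculation turns Paterson and Yao's O(D^9) size bound into
% a diameter lower bound of \Omega(n^{1/9}).
```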
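The annulus-counting outline in the last paragraph can also be read as a small computation. The sketch below is one interpretation of that bookkeeping, not the paper's argument: annulus k holds the points whose distance from the origin lies in [C^k, C^{k+1}), and each point is charged to the annulus of the outermost point on its directed path to the root. The value C = 4, the choice of origin, and the helper names are assumptions; nn is the outdegree-one adjacency list from the earlier sketch.

```python
# Sketch of the annulus assignment in the Paterson-Yao style argument:
# annulus k contains the points at distance in [C**k, C**(k+1)) from the
# origin (placed at the root of the nearest neighbor tree), and each point
# is charged to the annulus of the outermost point on its path to the root.
from math import hypot, log, floor

C = 4.0  # placeholder for the "suitably large constant" radius ratio

def annulus_index(point, origin=(0.0, 0.0)):
    """Index of the annulus containing point; None for the origin itself."""
    r = hypot(point[0] - origin[0], point[1] - origin[1])
    return floor(log(r, C)) if r > 0 else None

def assigned_annulus(i, points, nn, origin=(0.0, 0.0)):
    """Annulus of the outermost point on the path from points[i] to the root."""
    best, j, seen = None, i, set()
    while j not in seen:               # stop once the two-cycle at the root repeats
        seen.add(j)
        k = annulus_index(points[j], origin)
        if k is not None and (best is None or k > best):
            best = k
        j = nn[j]
    return best
```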